OneStopTesting - Quality Testing Jobs, eBooks, Articles, FAQs, Training Institutes, Testing Software, Testing downloads, testing news, testing tools, learn testing, manual testing, automated testing, load runner, winrunner, test director, silk test, STLC

    Agile Testing: Key Points for Unlearning


When quality assurance teams and their management first put Agile ideas to work, they face a significant impediment: unlearning the mind-set and habits that years of traditional practice have instilled in them.

    The following are some of the key aspects that need to be unlearned before attempting to deploy Agile practices, from a QA perspective:

    • The testing team needs to be independent and independently empowered in order to be effective.
    • Without a separate test strategy and test plan, it's tough to manage testing.
    • The V-model for verification and validation cannot be applied in an Agile sprint.
    • Independent testing teams don't do white-box testing.
    • The value of testing is realized only when defects are logged.
    • Automation is optional and is required only when regression testing is needed.
    • Testing nonfunctional aspects, such as performance of the system, is not possible in a sprint.
    • Testing must follow planning, specification, execution, and completion sequentially.
    • We don't have to write new test cases for detected defects.
    • Poorly written code is not the testing team's focus, as long as the code addresses the required functionality.
    • Test-process improvement models do not address aspects of Agile testing.

    Let's look at these assertions one by one.

    The testing team needs to be independent and independently empowered in order to be effective.

Traditionally, testing teams have followed different organizational styles: no independent testers, with developers performing the testing; independent testers within the development teams; independent testing performed by a separate division within the organization; or even outsourced independent testing. Often the testing team wants to be empowered and to report directly to a senior project manager rather than to the development or technical lead. The logic, or at least the perceived logic, is to allow the testing team to report and escalate technical defects without potential inhibition from the technical lead.

    The Agile testing mind-set change that's required is that the testers are an integral part of an Agile team. Their focus is to deliver a quality shippable product at the end of each sprint and to achieve the "done" state for the backlog items committed without any technical debt. The testers report to the Agile team and are accountable to the product owner or the business.

    Without a separate test strategy and test plan, it's tough to manage testing.

A test strategy document is typically defined at the organizational level, the division or portfolio level, or even at the product level. It seldom needs to be defined for each project, unless the project is large and spans many years. The project-specific test approach is documented in the project's test plan.

    In the case of Agile projects, the test approach can be documented in the release plan, and the sprint-specific testing activities during sprint planning. A separate test plan may not be required. However, having a test strategy at a level higher than the project could be useful, especially when the organization is undergoing transformation to Agile. The test strategy can define the Agile testing practices and the techniques to be followed across the organization or division; subsequently, Agile teams can adopt one or more of these practices while defining the test approach in the release plan for the particular project.

    The V-model for verification and validation cannot be applied in an Agile sprint.

Within an Agile sprint, verification and validation are addressed by adopting Agile practices, such as verifying that the INVEST criteria for documenting requirements are followed, creating and reviewing evocative documentation and simple design, reviewing visual models, holding daily stand-up meetings, reviewing radiator boards, following continuous integration, refactoring, running automated development tests and automated acceptance tests, holding focused reviews, and enhancing communication by having the product owner and customer on the team.

    The following figure shows an Agile V-model for verification and validation, as compared to the traditional V-model:

    Independent testing teams don't do white-box testing.

    Independent testing teams traditionally focus on black-box testing, possibly shrugging off any responsibility related to low-level testing. However, in Agile projects, testers play a significant role in automated development and acceptance tests. Agile testing is continuous and seldom staged. Agile testers need to understand the design and code-level aspects in order to effectively perform testing for a sprint. While the developers take the lead in unit testing, the Agile testing team shadows the low-level testing efforts and leads the automation aspect.
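As a minimal sketch of the kind of low-level testing an Agile tester might shadow, the example below uses Python's unittest to exercise both branches of a hypothetical discount-calculation function (the function, its name, and its business rule are invented for illustration):

```python
import unittest

def bulk_discount(quantity, unit_price):
    """Hypothetical production code: 10% off for orders of 100+ units."""
    total = quantity * unit_price
    if quantity >= 100:
        total *= 0.9
    return round(total, 2)

class BulkDiscountTest(unittest.TestCase):
    """White-box unit tests covering both branches of the discount logic."""

    def test_no_discount_below_threshold(self):
        self.assertEqual(bulk_discount(99, 1.00), 99.00)

    def test_discount_at_threshold(self):
        self.assertEqual(bulk_discount(100, 1.00), 90.00)
```

A suite like this would typically be run with `python -m unittest` as part of the sprint's continuous-integration build, so testers see code-level behavior, not just black-box results.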

    The value of testing is realized only when defects are logged.

While the value of testing lies in early detection of defects and in ensuring that the shippable product is of good quality, Agile teams need to unlearn the defect numbers-game mind-set. Teams may perceive that more detected defects indicate better performance of the testing team, and as a result many cosmetic defects get logged.

    This should be managed. The Agile testing team directly contributes to the "done" state of the product backlog item, which essentially means that a backlog item cannot be considered done unless it passes testing. Agile testing teams must make use of the radiator boards to effectively radiate the information on the status of the backlog items.

    Automation is optional and is required only when regression testing is needed.

    Automation is not optional; it's an essential aspect, especially when the business is trying to improve the time to market for its products. Agile teams working at peak velocity adopt such practices as continuous integration, automated development tests, and automated acceptance tests. Without automation and application of tools, the team cannot achieve the desired agility.

    Testing nonfunctional aspects, such as performance of the system, is not possible in a sprint.

    Sometimes it may not be possible to perform testing of nonfunctional aspects, such as system performance, within a sprint. However, this can be addressed by having a separate release sprint during release planning. The release sprint can address the required nonfunctional testing and also perform a cycle of acceptance testing to ensure that the system works after any defect fixes. Rigorous integration testing may not be required if the system was continuously integrated and tested by leveraging automation.
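Some coarse nonfunctional checks can run inside a sprint's automated suite even when full performance testing is deferred to a release sprint. The sketch below assumes a hypothetical `process_batch` function and an illustrative latency budget; a real team would derive the budget from its nonfunctional requirements:

```python
import time
import unittest

def process_batch(records):
    """Hypothetical function under test: normalizes a batch of records."""
    return [r.strip().lower() for r in records]

class BatchPerformanceTest(unittest.TestCase):
    """A coarse performance check cheap enough to run in every sprint."""

    def test_batch_within_latency_budget(self):
        records = [" Item-%d " % i for i in range(10_000)]
        start = time.perf_counter()
        result = process_batch(records)
        elapsed = time.perf_counter() - start
        self.assertEqual(len(result), 10_000)
        # Illustrative budget, not a real requirement.
        self.assertLess(elapsed, 1.0)
```

Such a check will not replace a load test, but it flags gross performance regressions early, in the spirit of continuous testing.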

    Testing must follow planning, specification, execution, and completion sequentially.

    The aspects of planning, specification, execution, and completion are highly relevant in Agile testing. However, we need to understand that Agile testing is continuous, not staged. While one backlog item may be marked "done," another item could be in its specification stages. Some teams follow the practice of updating a backlog item as "done" only when the test cases are automated for the backlog item.

    We don't have to write new test cases for detected defects.

Traditionally, test teams have not gone back to specify a test case for a detected defect, especially for defects found during exploratory testing. A key reason is the overhead of re-baselining the test case document and chasing signatures, since this is a change from the planned baseline. However, adapting to change is one of the Agile framework's foundational aspects. In Agile testing, a new test case is specified for any detected defect that doesn't already have an associated test case, and that test case is subsequently included in the automated test suite.
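A sketch of this practice, with an invented defect and function for illustration: suppose a tester logged a defect that a parser rejected amounts containing a thousands separator. After the fix, the defect is captured as a regression test and kept in the automated suite:

```python
import unittest

def parse_amount(text):
    """Hypothetical parser. Reported defect: inputs with a thousands
    separator (e.g. "1,250") raised ValueError. The fix strips the
    separator before conversion."""
    return float(text.replace(",", ""))

class SeparatorDefectRegressionTest(unittest.TestCase):
    """New test case written for the detected defect, then retained in
    the automation suite so the bug cannot silently return."""

    def test_thousands_separator_accepted(self):
        self.assertEqual(parse_amount("1,250"), 1250.0)

    def test_plain_amount_still_parses(self):
        self.assertEqual(parse_amount("99.5"), 99.5)
```

Because the test lives in the automated suite rather than in a baselined document, adding it requires no re-signing ceremony, which is precisely the point.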

    Poorly written code is not the testing team's focus, as long as the code addresses the required functionality.

This is related to the point above ("Independent testing teams don't do white-box testing"): independent test teams traditionally focus only on black-box testing and may be unconcerned with the quality of the code as long as it performs the required functionality. But the testing team can add significant value by providing early feedback and identifying technical debt, focusing on code-level aspects during both verification and validation. This is one of the key mind-set changes required of a new Agile tester.

    Test-process improvement models do not address aspects of Agile testing.

    In fact, we do have the ability to measure and improve Agile testing, using standard industry models. Test-process improvement methods such as TPI NEXT advocate business-driven test process improvement in an Agile environment by prioritizing the key areas of focus. This facilitates an Agile testing mind-set by mapping Agile principles with specific, prioritized areas. TPI NEXT also provides specific "improvement suggestions" for the checkpoints in priority areas of Agile testing.

    Conclusion

Although the task of performing testing is not very different in principle across Waterfall, iterative, and Agile development, the Agile mind-set and its testing practices provide effective new means to achieve the desired results. The agility lies in the Agile practices, rather than in the overarching process itself.


